Add parameters to make custom backbone for detr #14933
Conversation
@@ -155,6 +155,10 @@ def __init__(
        bbox_loss_coefficient=5,
        giou_loss_coefficient=2,
        eos_coefficient=0.1,
        in_chans=3,
Suggested change:
-        in_chans=3,
+        num_channels=3,
I would rename this to num_channels to be consistent with other models in the library (like ViT).
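For reference, a minimal sketch of how the renamed parameter could look from the user side, assuming the rename lands in DetrConfig as suggested (the exact signature is for this PR to decide):

```python
from transformers import DetrConfig, DetrForObjectDetection

# Hypothetical usage, assuming the parameter is renamed to num_channels:
# it sets the number of input image channels (3 for RGB).
config = DetrConfig(num_channels=3)
model = DetrForObjectDetection(config)
```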
@@ -155,6 +155,10 @@ def __init__(
        bbox_loss_coefficient=5,
        giou_loss_coefficient=2,
        eos_coefficient=0.1,
        in_chans=3,
        pretrained=True,
Suggested change:
-        pretrained=True,
+        use_pretrained_backbone=True,
This can be renamed to use_pretrained_backbone, for clarity.
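As a sketch, assuming the flag is renamed and wired into DetrConfig as proposed, it would let users opt out of loading the pretrained backbone weights:

```python
from transformers import DetrConfig, DetrModel

# Hypothetical flag name from this review: setting it to False would build the
# backbone with random initialisation instead of pretrained weights.
config = DetrConfig(use_pretrained_backbone=False)
model = DetrModel(config)
```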
@@ -155,6 +155,10 @@ def __init__(
        bbox_loss_coefficient=5,
        giou_loss_coefficient=2,
        eos_coefficient=0.1,
        in_chans=3,
        pretrained=True,
        freeze_layers=True,
I would leave this one out, as there's already a method one can call on DetrModel, called freeze_backbone, as seen here. Maybe we can improve its documentation for visibility.
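For context, the existing helpers can already be called on an instantiated model; a small usage sketch with a default config:

```python
from transformers import DetrConfig, DetrModel

model = DetrModel(DetrConfig())

# Existing helpers on DetrModel: they toggle requires_grad on the backbone.
model.freeze_backbone()    # backbone parameters stop receiving gradients
model.unfreeze_backbone()  # backbone becomes trainable again
```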
I think this parameter is more specific: it disables the freezing of the ResNet-50 layers that is currently just hard-coded into the encoder initialisation (only layers 2-4 are left trainable). But I can change it if you think freeze_backbone is better anyway.
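For readers following along, the behaviour under discussion is roughly the following: only layer2-layer4 of the ResNet stay trainable, while the stem and layer1 are frozen. A torchvision-based sketch of that logic (not the transformers code itself), assuming this is what the proposed freeze_layers flag would toggle:

```python
from torchvision.models import resnet50

# Sketch of the hard-coded behaviour: freeze everything except layers 2-4.
backbone = resnet50()
for name, parameter in backbone.named_parameters():
    if not name.startswith(("layer2", "layer3", "layer4")):
        parameter.requires_grad_(False)
```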
        in_chans=3,
        pretrained=True,
        freeze_layers=True,
        fix_batch_norm=True,
What's the reason you want to replace the frozen batch norm layers?
Training from scratch: I want to try training a fully randomly initialised model. Also, I don't have a pretrained backbone for my problem anyway, so I think this parameter won't do any harm.
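For background, frozen batch norm applies the stored statistics and affine parameters as fixed constants, so nothing is updated during training. A minimal sketch of such a layer (not the library's exact implementation):

```python
import torch
from torch import nn

class FrozenBatchNorm2d(nn.Module):
    """Batch norm with fixed statistics and affine parameters (never updated)."""

    def __init__(self, num_features, eps=1e-5):
        super().__init__()
        self.register_buffer("weight", torch.ones(num_features))
        self.register_buffer("bias", torch.zeros(num_features))
        self.register_buffer("running_mean", torch.zeros(num_features))
        self.register_buffer("running_var", torch.ones(num_features))
        self.eps = eps

    def forward(self, x):
        # y = (x - mean) / sqrt(var + eps) * weight + bias, all constants.
        scale = self.weight * (self.running_var + self.eps).rsqrt()
        shift = self.bias - self.running_mean * scale
        return x * scale[None, :, None, None] + shift[None, :, None, None]
```

Keeping regular nn.BatchNorm2d instead (which is what a fix_batch_norm=False option would presumably do) lets the statistics adapt during training, which matters when starting from random weights.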
The branch was force-pushed from 17a2e7b to 23986a3.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
What does this PR do?
Added a few parameters to make it possible to create custom backbone models for different use cases (partially explained in the mentioned issue).
Closes #14875
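As an illustration of the intended end result, combining the proposals above (parameter names follow the review suggestions and are not final):

```python
from transformers import DetrConfig, DetrForObjectDetection

# Hypothetical: a DETR model whose backbone accepts 1-channel inputs and starts
# from random weights instead of the pretrained ResNet-50 checkpoint.
config = DetrConfig(
    num_channels=1,
    use_pretrained_backbone=False,
)
model = DetrForObjectDetection(config)
```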
Before submitting
- Did you read the contributor guideline, Pull Request section?
- Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.
Maybe @LysandreJik can help with review?